Inner-Iteration Krylov Subspace Methods for Least Squares Problems
Authors
Abstract
Stationary inner iterations combined with Krylov subspace methods are proposed for least squares problems. The inner iterations are efficient in terms of computational work and memory, and also serve as powerful preconditioners for ill-conditioned and rank-deficient least squares problems. Theoretical justifications for using the inner iterations as preconditioners are presented. Numerical experiments on overdetermined least squares problems, including ill-conditioned and rank-deficient ones, show that the proposed methods outperform previous methods.
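As a rough sketch of the inner-outer structure described in the abstract, the code below runs CGLS (CG applied to the normal equations) and, in place of an explicitly formed preconditioner, applies a few damped Jacobi sweeps on the normal equations as the stationary inner iteration. The choice of Jacobi sweeps, the damping factor omega, and all iteration counts are illustrative assumptions, not the paper's specific algorithm:

```python
import numpy as np

def jacobi_nr(A, d, s, n_inner=4, omega=0.8):
    """Approximate (A^T A)^{-1} s by a few damped Jacobi sweeps on the
    normal equations A^T A z = s, starting from z = 0; `d` holds the
    diagonal of A^T A.  A fixed sweep count makes this a fixed linear
    operator, which is SPD (hence a valid CG preconditioner) whenever
    the damped sweep is convergent."""
    z = np.zeros_like(s)
    for _ in range(n_inner):
        z += omega * (s - A.T @ (A @ z)) / d
    return z

def cgls_inner(A, b, n_outer=50, tol=1e-10):
    """CGLS (CG on A^T A x = A^T b) with the Jacobi inner iteration above
    as the preconditioner -- a sketch of the inner-iteration idea, not
    the paper's exact method."""
    n = A.shape[1]
    d = np.sum(A * A, axis=0)              # diag(A^T A): squared column norms
    x, r = np.zeros(n), b.copy()           # r = b - A x
    s = A.T @ r                            # normal-equations residual
    z = jacobi_nr(A, d, s)
    p = z.copy()
    gamma = s @ z
    nrm0 = np.linalg.norm(s)
    for _ in range(n_outer):
        q = A @ p
        alpha = gamma / (q @ q)
        x += alpha * p
        r -= alpha * q
        s = A.T @ r
        if np.linalg.norm(s) <= tol * nrm0:
            break
        z = jacobi_nr(A, d, s)
        gamma_new = s @ z
        p = z + (gamma_new / gamma) * p
        gamma = gamma_new
    return x

# Small overdetermined test: A^T r -> 0 at a least squares solution.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
b = rng.standard_normal(100)
x = cgls_inner(A, b)
print(np.linalg.norm(A.T @ (b - A @ x)))
```

Note that the inner iteration needs only matrix-vector products with A and A^T, which is what makes this kind of preconditioning cheap in both work and memory compared with factorization-based preconditioners.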
Similar Resources
Preconditioned Krylov subspace methods for the solution of least-squares problems
… and K_k(BA, Br) = span{Br, (BA)Br, …, (BA)^{k−1}Br}, where B ∈ R^{n×m} is the mapping and preconditioning matrix, and apply Krylov subspace iteration methods on these subspaces. For overdetermined problems, applying the standard CG method to K_k(BA, Br) leads to the preconditioned CGLS [3] or CGNR [9] method, while for underdetermined problems it leads to the preconditioned CGNE [9] method. The GMRES...
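To make the subspace in this excerpt concrete, here is a minimal sketch that builds an explicit basis for K_k(BA, Br); taking B = A^T recovers the unpreconditioned CGLS subspace K_k(A^T A, A^T r). In practice such a basis is generated implicitly and kept orthogonal rather than stored this way, so the function and test sizes are purely illustrative:

```python
import numpy as np

def krylov_basis(A, B, r, k):
    """Return a matrix whose columns span
    K_k(BA, Br) = span{Br, (BA)Br, ..., (BA)^{k-1} Br},
    where B maps residual space R^m back to solution space R^n."""
    v = B @ r
    V = [v]
    for _ in range(k - 1):
        v = B @ (A @ v)                # multiply by BA one factor at a time
        V.append(v)
    return np.column_stack(V)

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 8))
r = rng.standard_normal(30)
V = krylov_basis(A, A.T, r, 4)         # B = A^T gives the CGLS subspace
print(V.shape)                         # (8, 4)
```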
TSIRM: A two-stage iteration with least-squares residual minimization algorithm to solve large sparse linear and nonlinear systems
In this paper, a two-stage iterative algorithm is proposed to improve the convergence of Krylov-based iterative methods, typically GMRES variants. The principle of the proposed approach is to build an external iteration over the Krylov method, and to frequently store its current residual (at each GMRES restart, for instance). After a given number of outer iterations, a least-squares min...
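A minimal sketch of this two-stage structure under stated assumptions: an inner solver runs for a fixed number of steps, the iterate after each inner solve is stored, and every s outer steps the stored iterates are recombined by a least-squares residual minimization. A plain Richardson sweep stands in for the restarted GMRES inner solver so the sketch is self-contained, and all step counts are arbitrary:

```python
import numpy as np

def two_stage_sketch(A, b, n_outer=8, s=4, inner_steps=10):
    """Outer loop stores the iterate produced by each inner solve in S;
    every s outer steps, x is replaced by the least-squares-optimal
    combination of the stored iterates, i.e. argmin_a ||b - A S a||."""
    n = len(b)
    x = np.zeros(n)
    S = np.zeros((n, s))
    omega = 1.0 / np.linalg.norm(A, 2)     # convergent Richardson step for SPD A
    for k in range(n_outer):
        for _ in range(inner_steps):       # inner stage (stand-in for GMRES)
            x += omega * (b - A @ x)
        S[:, k % s] = x                    # store the current iterate
        if (k + 1) % s == 0:               # outer stage: LS recombination
            a, *_ = np.linalg.lstsq(A @ S, b, rcond=None)
            x = S @ a
    return x

rng = np.random.default_rng(2)
M = rng.standard_normal((40, 40))
A = M @ M.T + 40.0 * np.eye(40)            # SPD test matrix
b = rng.standard_normal(40)
x = two_stage_sketch(A, b)
print(np.linalg.norm(b - A @ x))           # residual after the two stages
```

Because the latest iterate is itself a column of S, the recombined x is never worse than the inner solver's own iterate in residual norm.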
Multisplitting for regularized least squares with Krylov subspace recycling
The method of multisplitting, implemented as a restricted additive Schwarz type algorithm, is extended for the solution of regularized least squares problems. The presented non-stationary version of the algorithm uses dynamic updating of the weights applied to the subdomains in reconstituting the global solution. Standard convergence results follow from extensive prior literature on linear mult...
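A rough sketch of the multisplitting idea on a Tikhonov-regularized least squares problem, min ||Ax − b||² + λ||x||²: each sweep solves a local subproblem on every column block with the rest of x frozen, then applies the corrections with weights refreshed from each block's progress. The gain-based weighting rule is a hypothetical stand-in for the paper's dynamic updating, and the Krylov subspace recycling component is omitted:

```python
import numpy as np

def multisplitting_ls(A, b, blocks, lam=1e-2, n_sweeps=40):
    """Each sweep: solve a local regularized LS problem per column block
    against the current global residual, then blend the corrections with
    weights proportional to each block's residual reduction (illustrative
    dynamic weighting, not the paper's rule)."""
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        r = b - A @ x
        corrections, gains = [], []
        for J in blocks:
            AJ = A[:, J]
            # min_d ||AJ d - r||^2 + lam*||d||^2 via a stacked LS solve
            dJ, *_ = np.linalg.lstsq(
                np.vstack([AJ, np.sqrt(lam) * np.eye(len(J))]),
                np.concatenate([r, np.zeros(len(J))]), rcond=None)
            corrections.append(dJ)
            gains.append(np.linalg.norm(AJ @ dJ))
        w = np.asarray(gains) / (np.sum(gains) + 1e-300)   # dynamic weights
        for wi, J, dJ in zip(w, blocks, corrections):
            x[J] += wi * dJ                # weighted global recombination
    return x

rng = np.random.default_rng(3)
A = rng.standard_normal((60, 20))
b = rng.standard_normal(60)
blocks = [np.arange(0, 10), np.arange(10, 20)]
x = multisplitting_ls(A, b, blocks)
# Gradient of the regularized objective should be near zero at the solution.
print(np.linalg.norm(A.T @ (A @ x - b) + 1e-2 * x))
```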
Iterative Scaled Trust-Region Learning in Krylov Subspaces via Pearlmutter's Implicit Sparse Hessian-Vector Multiply
The online incremental gradient (or backpropagation) algorithm is widely considered to be the fastest method for solving large-scale neural-network (NN) learning problems. In contrast, we show that an appropriately implemented iterative batch-mode (or block-mode) learning method can be much faster. For example, it is three times faster in the UCI letter classification problem (26 outputs, 16,00...
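The implicit Hessian-vector multiply named in the title can be illustrated matrix-free. Pearlmutter's R-operator computes H·v exactly with one extra differentiation pass; the sketch below instead uses a central finite difference of gradients, which approximates the same product and conveys the key point that H·v never requires forming H. The least-squares loss is chosen only because its Hessian A^T A is available for checking:

```python
import numpy as np

def hvp_fd(grad, w, v, eps=1e-6):
    """Matrix-free Hessian-vector product: H(w) @ v is approximated by
    (g(w + eps*v) - g(w - eps*v)) / (2*eps).  Pearlmutter's R-operator
    yields this product exactly; the finite difference only illustrates
    the matrix-free principle."""
    return (grad(w + eps * v) - grad(w - eps * v)) / (2.0 * eps)

rng = np.random.default_rng(4)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
grad = lambda w: A.T @ (A @ w - b)     # gradient of 0.5*||Aw - b||^2
w = rng.standard_normal(10)
v = rng.standard_normal(10)
# For this quadratic loss H = A^T A, so the two products should agree
# up to floating-point cancellation in the finite difference.
print(np.linalg.norm(hvp_fd(grad, w, v) - A.T @ (A @ v)))
```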
A Class of Nested Iteration Schemes for Generalized Coupled Sylvester Matrix Equation
Global Krylov subspace methods are among the most efficient and robust methods for solving the generalized coupled Sylvester matrix equation. In this paper, we propose the nested splitting conjugate gradient process for solving this equation. This method has inner and outer iterations: it employs the generalized conjugate gradient method as an inner iteration to approximate each outer iterate, while each...
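A sketch of this nested inner-outer structure on a plain linear system rather than the coupled Sylvester equation, with plain CG standing in for the generalized CG inner iteration: split A = M − N with M the tridiagonal part of A, and solve each outer splitting step M x⁺ = N x + b inexactly by a few inner CG iterations. The splitting, the diagonally dominant test matrix, and the iteration counts are illustrative assumptions:

```python
import numpy as np

def cg(M, rhs, x0, n_steps):
    """A few plain CG steps for the SPD inner system M z = rhs,
    warm-started at the current outer iterate."""
    x = x0.copy()
    r = rhs - M @ x
    p = r.copy()
    rs = r @ r
    for _ in range(n_steps):
        if rs < 1e-30:
            break                      # already converged
        Mp = M @ p
        alpha = rs / (p @ Mp)
        x += alpha * p
        r -= alpha * Mp
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def nested_splitting_cg(A, b, n_outer=25, n_inner=6):
    """Outer stationary splitting A = M - N (M = tridiagonal part of A);
    each outer iterate M x_{k+1} = N x_k + b is approximated by inner CG."""
    M = np.triu(np.tril(A, 1), -1)     # tridiagonal part of A
    N = M - A
    x = np.zeros(len(b))
    for _ in range(n_outer):
        x = cg(M, N @ x + b, x, n_inner)
    return x

rng = np.random.default_rng(5)
n = 30
R = rng.standard_normal((n, n))
A = 0.02 * (R + R.T) + 2.0 * np.eye(n)  # SPD and diagonally dominant
b = rng.standard_normal(n)
x = nested_splitting_cg(A, b)
print(np.linalg.norm(b - A @ x))        # fixed point satisfies A x = b
```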
Journal: SIAM J. Matrix Analysis Applications
Volume: 34, Issue: -
Pages: -
Published: 2013